# SLERP Fusion Models
## Magnolia Mell V1 12B GGUF
**Author:** grimjim · **Downloads:** 182 · **Likes:** 1
**Tags:** Large Language Model, Transformers

A set of quantized GGUF files for a pre-trained language model merge created with mergekit, using an asymmetric gradient SLERP method to fuse two 12B-parameter models. Suitable for text generation tasks.

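The SLERP merges listed on this page all rest on the same idea: instead of averaging two models' weights linearly, spherical linear interpolation follows the arc between them on the hypersphere, preserving the weights' norm. A minimal NumPy sketch, assuming flattened weight tensors (the function name and linear-interpolation fallback are illustrative, not mergekit's actual implementation; mergekit's gradient SLERP additionally varies the factor `t` across layers):

```python
import numpy as np

def slerp(w1, w2, t, eps=1e-8):
    """Spherical linear interpolation between two flattened weight tensors.

    t=0 returns w1, t=1 returns w2; intermediate t follows the great-circle
    arc between the two (normalized) weight directions.
    """
    # Normalize copies to measure the angle between the two weight vectors.
    v1 = w1 / (np.linalg.norm(w1) + eps)
    v2 = w2 / (np.linalg.norm(w2) + eps)
    dot = np.clip(np.dot(v1, v2), -1.0, 1.0)
    theta = np.arccos(dot)
    if theta < eps:
        # Nearly parallel weights: fall back to plain linear interpolation.
        return (1 - t) * w1 + t * w2
    sin_theta = np.sin(theta)
    return (np.sin((1 - t) * theta) / sin_theta) * w1 \
         + (np.sin(t * theta) / sin_theta) * w2
```

In a real merge this would be applied tensor-by-tensor across both checkpoints; for orthogonal unit vectors, the midpoint `slerp(a, b, 0.5)` still has unit norm, which is the property linear averaging loses.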
## Patricide 12B Unslop Mell
**Author:** redrix · **License:** Apache-2.0 · **Downloads:** 1,311 · **Likes:** 16
**Tags:** Large Language Model, Transformers

A 12B-parameter language model built with the SLERP merge method, combining the strengths of the Mag-Mell and UnslopNemo models. Suitable for creative writing and role-playing scenarios.

## Cydonia V1.3 Magnum V4 22B
**Author:** knifeayumu · **License:** Other · **Downloads:** 822 · **Likes:** 41
**Tags:** Large Language Model, Transformers

A 22B-parameter language model merged from Cydonia-22B-v1.3 and Magnum-v4-22B using the SLERP method.

## Cydonia V1.2 Magnum V4 22B
**Author:** knifeayumu · **License:** Other · **Downloads:** 52 · **Likes:** 18
**Tags:** Large Language Model, Transformers

A 22B-parameter language model merged from Cydonia-22B-v1.2 and Magnum-v4-22B using the SLERP method.

## Fimbulvetr Kuro Lotus 10.7B
**Author:** saishf · **Downloads:** 57 · **Likes:** 18
**Tags:** Large Language Model, Transformers

A hybrid model created through SLERP (spherical linear interpolation) of two 10.7B-parameter models, combining the strengths of Fimbulvetr and Kuro-Lotus and performing well on text generation tasks.

## Chupacabra 7B V2
**Author:** perlthoughts · **License:** Apache-2.0 · **Downloads:** 99 · **Likes:** 35
**Tags:** Large Language Model, Transformers

A 7B-parameter large language model based on the Mistral architecture that uses SLERP merging to combine weights from multiple high-performance models.